Table of Contents

Journal of Medical Signals and Sensors
Volume:1 Issue: 1, Jan-Apr 2011

  • Date of publication: 1390/03/12 (Iranian calendar)
  • Number of articles: 9
  • Habibollah Danyali, Alfred Mertins Page 1
    In this paper, an object-based, highly scalable, lossy-to-lossless 3D wavelet coding approach for volumetric medical image data (e.g., MR and CT) is proposed. The new method, called 3DOBHS-SPIHT, is based on the well-known set partitioning in hierarchical trees (SPIHT) algorithm and supports both quality and resolution scalability. The 3D input data is grouped into groups of slices (GOS), and each GOS is encoded and decoded as a separate unit. The symmetric tree definition of the original 3D-SPIHT is improved by introducing a new asymmetric tree structure. While preserving compression efficiency, the new tree structure allows for a small size of each GOS, which not only reduces memory consumption during the encoding and decoding processes but also facilitates more efficient random access to certain segments of slices. To achieve higher compression efficiency, the algorithm encodes only the main object of interest in each 3D data set, which can have any arbitrary shape, and ignores the unnecessary background. The experimental results on some MR data sets show the good performance of the 3DOBHS-SPIHT algorithm for multiresolution lossy-to-lossless coding. The compression efficiency, full scalability, and object-based features of the proposed approach, besides its lossy-to-lossless coding support, make it a very attractive candidate for volumetric medical image archiving and transmission applications.
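    The group-of-slices partitioning described above can be sketched as follows. This is a minimal illustration, not the paper's codec: the GOS size, volume dimensions, and helper names are hypothetical, and the actual wavelet/SPIHT coding of each GOS is omitted.

```python
def split_into_gos(slices, gos_size):
    """Partition a stack of 2D slices into independent groups of slices (GOS).

    Each GOS is encoded/decoded as a separate unit, which bounds memory use
    and enables random access to a segment of slices.
    """
    return [slices[i:i + gos_size] for i in range(0, len(slices), gos_size)]

def gos_for_slice(slice_idx, gos_size):
    """Random access: return (GOS index, offset within that GOS) for a slice."""
    return slice_idx // gos_size, slice_idx % gos_size

# Hypothetical 64-slice volume, GOS of 16 slices -> 4 independent units
volume = [[[0] * 128 for _ in range(128)] for _ in range(64)]
groups = split_into_gos(volume, 16)
```

    With this layout, decoding slice 37 requires decompressing only GOS 2 rather than the whole volume, which is the random-access benefit the abstract refers to.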
  • Abdol Hamid Pilevar Page 12
    We propose a novel algorithm for the retrieval of images from medical image databases by content. To achieve efficient representation and retrieval, attention is focused on the methodology: the content of a medical image is represented by its regions and the relationships between such objects or regions, described by the Image Attributes (IA) of the objects. In this paper, we propose MCBIR (Content-Based Image Retrieval for Medical image databases), a novel content-based retrieval algorithm that is robust to scaling and translation of objects within an image. MCBIR employs a new model in which each image is first decomposed into regions. The similarity measurement between images is developed based on a scheme that integrates the properties of all the regions in the images using regional matching. The method can answer queries by example. The efficiency and performance of the presented method have been evaluated using a dataset of about 5,000 simulated but realistic computed tomography (CT) and magnetic resonance (MR) images, whose original images were selected from three large medical image databases. The results of our experiments show a success rate of more than 93 percent, which is satisfactory.
  • S.H. Sabzpoushan, Fateme Pourhasanzade Page 19
    Ventricular fibrillation is the cause of most sudden cardiac deaths. Recent findings have clearly demonstrated the correlation between the slope of the restitution curve and ventricular fibrillation. Cellular automata are a powerful tool for simulating complex phenomena in a simple language. In this paper, a simple model is proposed for simulating the restitution property of a single cardiac cell using cellular automata. First, two state variables, action potential and recovery, are introduced into the automaton model. Second, the automaton rule is determined, and the recovery variable is then defined in such a way that restitution develops. To evaluate the proposed model, the restitution curve generated in our study is compared with restitution curves from experimental findings in authoritative sources. Our findings indicate that the presented model is not only capable of simulating restitution in a cardiac cell but also possesses the capability of regulating the restitution curve.
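    The two-variable idea above can be illustrated with a toy iteration: each action potential duration (APD) is set by the preceding diastolic interval (DI) through a restitution rule. The exponential form and all constants below are hypothetical placeholders, not the paper's automaton rule.

```python
import math

def restitution_apd(di, apd_max=300.0, amp=200.0, tau=100.0):
    """Hypothetical exponential restitution curve: APD (ms) as a
    monotonically increasing function of the preceding DI (ms)."""
    return apd_max - amp * math.exp(-di / tau)

def simulate_cell(pacing_cl, n_beats, apd0=250.0):
    """Two-variable cell: it alternates between an excited phase (length =
    APD) and a recovery phase (length = DI); each new APD is computed from
    the previous DI by the restitution rule."""
    apd, history = apd0, []
    for _ in range(n_beats):
        di = pacing_cl - apd            # recovery time before next stimulus
        if di <= 0:                     # stimulus hits a refractory cell:
            di += pacing_cl             # the beat is skipped
        apd = restitution_apd(di)
        history.append((di, apd))
    return history
```

    Pacing such a cell at a fixed cycle length drives the (DI, APD) pair to a steady state whenever the restitution slope at the fixed point is below one, which is the slope criterion the abstract connects to fibrillation.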
  • Hossein Bolandi, Amir Farhad Ehyaei Page 24
    We design a two-stage scheme to address the trajectory-planning problem of two mobile manipulators cooperatively transporting a rigid body in the presence of static obstacles. In the first stage, taking the static obstacles into account, we develop a method that searches the workspace for the shortest possible path between the start and goal configurations by constructing a graph on a portion of the configuration space that satisfies collision and closure constraints. The final stage is to calculate a sequence of time-optimal trajectories between consecutive points of the path, respecting the nonholonomic constraints and the maximum allowed joint accelerations. This approach allows geometric constraints, such as joint limits and closed-chain constraints, along with differential constraints, such as nonholonomic velocity constraints and acceleration limits, to be incorporated into the planning scheme. Simulation results illustrate the effectiveness of the proposed method.
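    The first stage — shortest path over a graph of feasible configurations — can be sketched with a standard Dijkstra search. This is a generic sketch, not the paper's planner: how nodes are sampled and checked against collision and closure constraints is abstracted away into the input graph.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra search over a graph whose nodes are collision-free,
    closure-satisfying configurations. `graph` maps node -> list of
    (neighbor, edge_cost); returns (path, total_cost)."""
    dist, prev = {start: 0.0}, {}
    pq, visited = [(0.0, start)], set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [goal], goal          # walk predecessors back to start
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]
```

    The resulting waypoint sequence is exactly what the second stage consumes when fitting time-optimal trajectories between consecutive configurations.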
  • Vahid Reza Nafisi, Manouchehr Eghbal, Mohammad Reza Jahed Motlagh, Fatemeh Yavari Page 36
    Fuzzy controllers are used in various control schemes. The aim of this work was to adjust hemodialysis machine parameters using a fuzzy logic controller (FLC) so that the patient's hemodynamic condition remains stable during hemodialysis treatment. For this purpose, a comprehensive mathematical model of the arterial pressure response during hemodialysis, including hemodynamic, osmotic, and regulatory phenomena, was used. The multi-input multi-output (MIMO) fuzzy logic controller receives three parameters from the model (heart rate, arterial blood pressure, and relative blood volume) as inputs. According to the changes in the controller's input values and its rule base, the outputs change so that the patient's hemodynamic condition remains stable. Simulation results illustrate that applying the controller can improve the stability of the patient's hemodynamic condition during hemodialysis treatment and also decrease treatment time. Furthermore, with fuzzy logic there is no need for prior knowledge about the system under control, and the FLC is adaptable to different patients.
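    A minimal sketch of the fuzzy-control idea, assuming triangular membership functions and a tiny two-rule base over blood pressure and relative blood volume. All membership ranges, rule outputs, and the weighted-average defuzzification below are hypothetical; the paper's MIMO controller has more inputs and a richer rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def flc_uf_rate(map_mmHg, rbv_pct):
    """Toy 2-input / 1-output fuzzy rule base (hypothetical ranges):
    rule 1 (OR via max):  pressure low OR blood volume dropped -> slow
                          ultrafiltration (normalized rate 0.2);
    rule 2 (AND via min): pressure OK AND volume OK -> keep rate 1.0.
    Output is defuzzified by the weighted average of rule strengths."""
    bp_low  = tri(map_mmHg, 50, 70, 90)
    bp_ok   = tri(map_mmHg, 80, 100, 120)
    rbv_low = tri(rbv_pct, 80, 85, 92)
    rbv_ok  = tri(rbv_pct, 90, 100, 110)
    r_slow = max(bp_low, rbv_low)
    r_keep = min(bp_ok, rbv_ok)
    den = r_slow + r_keep
    return (r_slow * 0.2 + r_keep * 1.0) / den if den else 0.5
```

    A stable patient (pressure ~100 mmHg, volume ~100%) keeps the full rate, while a hypotensive, volume-depleted patient is smoothly throttled down — the qualitative behavior the abstract attributes to the FLC.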
  • Maryam Taghizadeh Dehkordi, Saeed Sadri, Alimohamad Doosthoseini Page 49
    Coronary heart disease has been one of the main threats to human health. Coronary angiography is taken as the "gold standard" for the assessment of coronary artery disease. Sometimes, however, the images are difficult to interpret visually because of the crossing and overlapping of vessels in the angiograms. Vessel extraction from X-ray angiograms has been a challenging problem for several years. Extraction of vessels faces several difficulties: weak contrast between the coronary arteries and the background, the unknown and easily deformable shape of the vessel tree, sometimes overlapping strong shadows of bones, and so on. In this paper, we investigate coronary vessel extraction and enhancement techniques and algorithms, and present the capabilities of the most important algorithms for coronary vessel segmentation.
  • Mahmoud Saghaei Page 55
    Randomization is an essential component of sound clinical trials; it prevents selection biases and helps in blinding the allocations. Randomization is a process by which subjects are enrolled into trial groups only by chance, which essentially eliminates selection biases. A serious risk of randomization is severe imbalance among treatment groups with respect to some prognostic factors, which may invalidate the trial results or necessitate complex and usually unreliable secondary analyses to eradicate the source of the imbalances. Minimization, on the other hand, tends to allocate subjects in a way that minimizes the differences among groups with respect to prognostic factors. Pure minimization is therefore completely deterministic: one can predict the allocation of the next subject by knowing the factor levels of the previously enrolled subjects and the properties of the next subject. To eliminate this predictability, it is necessary to include some element of randomness in the minimization algorithm. In this article, brief descriptions of randomization and minimization are presented, followed by an introduction to selected randomization and minimization programs.
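    The combination described above — minimization plus an element of randomness — can be sketched as a biased-coin rule in the style of Pocock and Simon. The imbalance score and the probability parameter below are one common choice, not necessarily what the reviewed programs implement.

```python
import random

def minimization_assign(subject, enrolled, groups=("A", "B"),
                        p_best=0.8, rng=random):
    """Minimization with a biased coin.

    `subject` maps factor -> level; `enrolled` is a list of
    (factor-dict, group) for prior subjects. For each candidate group we
    compute the total range of group counts, over the new subject's factor
    levels, that would result from assigning them there. The group that
    minimizes imbalance is chosen with probability p_best; otherwise a
    random other group is chosen, keeping the allocation unpredictable.
    """
    def imbalance_if(group):
        total = 0
        for factor, level in subject.items():
            counts = {g: 0 for g in groups}
            for prev, g in enrolled:
                if prev.get(factor) == level:
                    counts[g] += 1
            counts[group] += 1          # hypothetical assignment
            total += max(counts.values()) - min(counts.values())
        return total

    scores = {g: imbalance_if(g) for g in groups}
    best = min(scores, key=scores.get)
    if rng.random() < p_best:
        return best
    return rng.choice([g for g in groups if g != best])
```

    Setting p_best to 1.0 recovers pure (deterministic, hence predictable) minimization; values like 0.7-0.8 retain most of the balancing while restoring unpredictability.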
  • Farzaneh Shayegh, Rasoul Amirfattahi, Saeed Sadri, Karim Ansari-Asl Page 62
    In recent decades, seizure prediction has motivated a great deal of research in both the signal processing and neuroscience fields. Researchers have tried to enhance conventional seizure prediction algorithms so that the false-alarm rate becomes small enough for seizures to be predicted according to clinical standards. Up to now, none of the proposed algorithms have been sufficiently adequate. In this paper, we show that by considering the mechanism of seizure generation, the prediction results may be improved. For this purpose, an algorithm based on the identification of the parameters of a physiological model of seizure is introduced. Some models of EEG signals that can potentially also be considered models of seizure, along with some developed seizure models, are reviewed. As an example, a model of depth-EEG signals proposed by Wendling is studied and shown to be a suitable model.
  • Keyvan Jabbari Page 73
    An important need in radiation therapy is a fast and accurate treatment planning system. Using CT data and the characteristics of the radiation beam, this system calculates the dose at all points of the patient's volume. The two main factors in a planning system are accuracy and speed, and various generations of treatment planning systems have been developed according to these factors. This article reviews fast Monte Carlo treatment planning systems, which possess both factors at the same time. Monte Carlo techniques are based on the transport of each individual particle (e.g., photon or electron) through the tissue; the transport of the particle is computed using the physics of the interactions of the particles with matter, whereas other techniques transport the particles as a group. For a typical dose calculation in radiation therapy, the code has to transport a few million particles, which takes a few hours. Therefore, conventional Monte Carlo techniques are accurate but too slow for clinical use. In recent years, with the development of "fast" Monte Carlo systems, one is able to perform the dose calculation in a reasonable time for clinical use; the acceptable time for a dose calculation is on the order of one minute. There is currently growing interest in fast Monte Carlo treatment planning, and there are many commercial treatment planning systems that perform dose calculation for radiation therapy based on Monte Carlo techniques.
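    The per-particle transport that makes Monte Carlo accurate but slow can be illustrated with a deliberately oversimplified 1-D sketch: photons enter a homogeneous slab and deposit all their energy at the first interaction site, sampled from the exponential free-path distribution. The attenuation coefficient and geometry are hypothetical, and real codes also transport scattered photons and secondary electrons and score energy along the whole track.

```python
import math
import random

def mc_depth_dose(n_particles, mu=0.05, depth=30.0, bins=30, seed=1):
    """Toy 1-D Monte Carlo depth-dose: sample each photon's free path
    from p(s) = mu * exp(-mu * s) (mu in 1/cm) and tally the energy in
    the depth bin where it first interacts. Illustrates why cost scales
    with the number of histories transported."""
    rng = random.Random(seed)
    dose = [0.0] * bins
    width = depth / bins
    for _ in range(n_particles):
        # inverse-transform sampling of the exponential free path
        path = -math.log(1.0 - rng.random()) / mu
        if path < depth:
            dose[int(path / width)] += 1.0
    return dose
```

    Even this toy version needs many histories for a smooth curve; scaling the same loop to millions of particles with full interaction physics is what pushes conventional Monte Carlo dose calculations into the range of hours, and what fast Monte Carlo systems attack with variance reduction and simplified transport.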